Feature Selection Using Grey Wolf Optimization with Random Differential Grouping

Authors

Abstract

Big data is regarded as a tremendous technology for processing a huge variety of data in a short time and with large storage capacity. Users' access over the internet creates massive amounts of data, which requires an intelligent feature selection model capable of addressing such varied data. Traditional techniques are applicable only to simple data mining; big data and machine learning need intelligent techniques for efficient classification. Most algorithms read the input features as they are, and the data are then preprocessed and classified. Such an algorithm does not consider the relatedness among variables, so during selection all features may be misread, and a less optimal solution is achieved. In our proposed research, we focus on a supervised technique called grey wolf optimization (GWO) with decomposed random differential grouping (DrnDG-GWO). First, decomposition of the feature set into subsets based on the relatedness of variables is performed. Random differential grouping is carried out using the fitness value of two variables. Every subset population is then optimized using GWO; this combination of grouping and swarm intelligence produces the best results in this research. Once the features are optimized, the data are classified with an advanced kNN process to produce accurate results. DrnDG-GWO is compared with standard PSO to assess the efficiency of the algorithm. The accuracy and time complexity of the proposed algorithm are 98% and 5 s, respectively, which are better than those of existing techniques.
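As a rough illustration of the pipeline the abstract describes, the sketch below groups features by a random pairwise differential test, runs a sigmoid-binarized grey wolf search inside each group, and scores candidate subsets with a kNN classifier. The grouping test, transfer function, wolf count, and iteration budget are assumptions for illustration; the paper's exact DrnDG-GWO formulation is not reproduced here.

```python
# Minimal sketch of the DrnDG-GWO idea described in the abstract:
# (1) group feature indices by a differential/relatedness test,
# (2) run a binary grey wolf search inside each group,
# (3) score candidate subsets with a kNN classifier.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def knn_fitness(X, y, mask):
    """Cross-validated kNN accuracy of the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def random_differential_grouping(X, y, threshold=1e-3, pairs=30):
    """Assumed grouping step: merge two features into one group when
    perturbing them jointly changes the fitness differently than
    perturbing them separately (a random-pair differential test)."""
    d = X.shape[1]
    groups = [[j] for j in range(d)]
    base = knn_fitness(X, y, np.ones(d))
    for _ in range(pairs):
        i, j = rng.choice(d, size=2, replace=False)
        m_i, m_j, m_ij = np.ones(d), np.ones(d), np.ones(d)
        m_i[i] = m_j[j] = m_ij[i] = m_ij[j] = 0
        delta = (knn_fitness(X, y, m_ij) - knn_fitness(X, y, m_i)
                 - knn_fitness(X, y, m_j) + base)
        if abs(delta) > threshold:              # treat i and j as interacting
            gi = next(g for g in groups if i in g)
            gj = next(g for g in groups if j in g)
            if gi is not gj:
                gi.extend(gj)
                groups.remove(gj)
    return groups

def gwo_select(X, y, group, wolves=8, iters=20):
    """Assumed binary GWO inside one group: continuous wolf positions are
    squashed through a sigmoid into 0/1 feature masks."""
    d_full, d = X.shape[1], len(group)
    pos = rng.uniform(-1, 1, size=(wolves, d))

    def mask_of(p):
        m = np.zeros(d_full)
        m[np.array(group)[1 / (1 + np.exp(-p)) > 0.5]] = 1
        return m

    for t in range(iters):
        fits = np.array([knn_fitness(X, y, mask_of(p)) for p in pos])
        alpha, beta, delta = pos[np.argsort(fits)[::-1][:3]]  # three best wolves
        a = 2 - 2 * t / iters                                  # linearly decreasing coefficient
        for k in range(wolves):
            new = np.zeros(d)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(d), rng.random(d)
                A, C = 2 * a * r1 - a, 2 * r2
                new += leader - A * np.abs(C * leader - pos[k])
            pos[k] = new / 3                                   # average pull toward the leaders
    best = max(pos, key=lambda p: knn_fitness(X, y, mask_of(p)))
    return mask_of(best)

def drndg_gwo(X, y):
    """Union of the best subsets found per group."""
    selected = np.zeros(X.shape[1])
    for group in random_differential_grouping(X, y):
        selected = np.maximum(selected, gwo_select(X, y, group))
    return selected.astype(bool)
```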

Related articles

Binary grey wolf optimization approaches for feature selection

In this work, a novel binary version of the grey wolf optimization (GWO) is proposed and used to select optimal feature subset for classification purposes. Grey wolf optimizer (GWO) is one of the latest bioinspired optimization techniques, which simulate the hunting process of grey wolves in nature. The binary version introduced here is performed using two different approaches. In the first app...
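The snippet above is cut off before the two binarization approaches are described; the fragment below shows one common way such binary GWO variants map a continuous wolf position to a 0/1 feature mask through a sigmoid transfer function, as an assumed illustration rather than that paper's exact method.

```python
# Common transfer-function style binarization for a continuous GWO step:
# squash each updated coordinate through a sigmoid and sample a 0/1 bit.
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binarize_position(continuous_pos):
    """Map a continuous wolf position to a binary feature mask."""
    prob = sigmoid(continuous_pos)                 # per-feature selection probability
    return (rng.random(prob.shape) < prob).astype(int)

# Example: a 6-dimensional wolf position becomes a 0/1 feature subset.
print(binarize_position(np.array([-2.0, -0.5, 0.0, 0.4, 1.5, 3.0])))
```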

An Enhanced Grey Wolf Optimization Based Feature Selection Wrapped Kernel Extreme Learning Machine for Medical Diagnosis

In this study, a new predictive framework is proposed by integrating an improved grey wolf optimization (IGWO) and kernel extreme learning machine (KELM), termed as IGWO-KELM, for medical diagnosis. The proposed IGWO feature selection approach is used for the purpose of finding the optimal feature subset for medical data. In the proposed approach, genetic algorithm (GA) was firstly adopted to g...
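As context for the learning component named here, the following is a minimal kernel extreme learning machine (KELM) classifier in its standard closed form; the RBF kernel and the regularization constant C are illustrative choices, not the settings used in the IGWO-KELM framework.

```python
# Minimal KELM sketch: output weights beta = (Omega + I/C)^(-1) T,
# prediction = K(x, X_train) . beta, where Omega is the training kernel matrix.
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class KELM:
    def fit(self, X, y, C=1.0):
        self.X = X
        T = np.eye(y.max() + 1)[y]                         # one-hot targets
        omega = rbf_kernel(X, X)
        self.beta = np.linalg.solve(omega + np.eye(len(X)) / C, T)
        return self

    def predict(self, Xnew):
        return rbf_kernel(Xnew, self.X).dot(self.beta).argmax(axis=1)
```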

Structural Damage Assessment Via Model Updating Using Augmented Grey Wolf Optimization Algorithm (AGWO)

Some civil engineering infrastructures are planned for a Structural Health Monitoring (SHM) system based on their importance. Identifying and detecting damage automatically at the right time is one of the major objectives of such a system. One of the methods to meet this objective is model updating with the use of optimization algorithms in structures. This paper aims to evaluate the...

An Improved Bat Algorithm with Grey Wolf Optimizer for Solving Continuous Optimization Problems

Metaheuristic algorithms are used to solve NP-hard optimization problems. These algorithms have two main components, i.e. exploration and exploitation, and try to strike a balance between exploration and exploitation to achieve the best possible near-optimal solution. The bat algorithm is one of the metaheuristic algorithms with poor exploration and exploitation. In this paper, exploration and ...
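The snippet is truncated before the hybridization with the grey wolf optimizer is explained, so the sketch below only shows the basic bat-algorithm move that such hybrids typically modify: a frequency-tuned velocity update that pulls each bat toward the current best solution (the frequency range is an assumed default).

```python
# One iteration of the basic bat-algorithm position update.
import numpy as np

rng = np.random.default_rng(2)

def bat_step(positions, velocities, best, f_min=0.0, f_max=2.0):
    """positions, velocities: (n_bats, dim); best: (dim,) current best solution."""
    n = positions.shape[0]
    freq = f_min + (f_max - f_min) * rng.random((n, 1))   # random pulse frequency per bat
    velocities = velocities + (positions - best) * freq    # pull toward the best solution
    return positions + velocities, velocities
```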

Using Fuzzy Dependency-Guided Attribute Grouping in Feature Selection

Feature selection has become a vital step in many machine learning techniques due to their inability to handle high dimensional descriptions of input features. This paper demonstrates the applicability of fuzzy-rough attribute reduction and fuzzy dependencies to the problem of learning classifiers, resulting in simpler rules with little loss in classification accuracy.
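To make the notion of fuzzy dependency concrete, the sketch below computes a fuzzy-rough dependency degree of the decision attribute on a candidate feature subset; the min t-norm similarity relation and the max(1 - a, b) implicator are common textbook choices assumed here, not necessarily the exact operators used in that paper.

```python
# Fuzzy-rough dependency degree gamma(attrs -> y) in [0, 1]:
# the average membership of objects in the fuzzy positive region of the decision.
import numpy as np

def fuzzy_dependency(X, y, attrs):
    Xs = X[:, attrs]
    span = Xs.max(axis=0) - Xs.min(axis=0) + 1e-12
    # Per-attribute fuzzy similarity (1 - normalized distance), combined with min t-norm.
    sims = 1.0 - np.abs(Xs[:, None, :] - Xs[None, :, :]) / span
    R = sims.min(axis=2)
    pos = np.zeros(len(X))
    for c in np.unique(y):
        member = (y == c).astype(float)                      # crisp decision class membership
        # Fuzzy lower approximation: inf over objects of max(1 - R(x, y), class(y)).
        lower = np.maximum(1.0 - R, member[None, :]).min(axis=1)
        pos = np.maximum(pos, lower)
    return pos.mean()

# Example: fuzzy_dependency(X, y, [0, 2, 5]) scores the subset {0, 2, 5}.
```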

Journal

Journal title: Computer Systems Science and Engineering

Year: 2022

ISSN: 0267-6192

DOI: https://doi.org/10.32604/csse.2022.020487